Diversity creation in local search for the evolution of neural network ensembles
Authors
Abstract
The EENCL algorithm [1] automatically designs neural network ensembles for classification, combining global evolution with local search based on gradient descent. Two mechanisms encourage diversity: Negative Correlation Learning (NCL) and implicit fitness sharing. This paper analyses EENCL, finding that NCL is not an essential component of the algorithm, while implicit fitness sharing is. Furthermore, we find that a local search based on independent training is equally effective in both accuracy and diversity. We propose that NCL is unnecessary in EENCL for the tested datasets, and that complementary diversity in local search and global evolution may lead to better ensembles.
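To illustrate the Negative Correlation Learning (NCL) mechanism discussed above, the following is a minimal sketch, not the paper's implementation: a toy ensemble trained by gradient descent with an NCL-style error signal. The regression data, random-feature models, and all hyperparameter values here are hypothetical; setting the penalty strength to zero recovers the independent training the paper compares against.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression data (hypothetical; EENCL itself is evaluated on
# classification datasets).
X = rng.uniform(-1.0, 1.0, (200, 1))
y = np.sin(3.0 * X[:, 0]) + 0.1 * rng.normal(size=200)

M = 4        # ensemble size
lam = 0.5    # NCL penalty strength; lam = 0.0 recovers independent training
lr = 0.05    # gradient-descent step size
epochs = 500

# Each member is a small fixed-random-feature linear model
# f_i(x) = phi_i(x) @ w_i, standing in for a neural network.
W_in = [rng.normal(size=(1, 20)) for _ in range(M)]
w = [np.zeros(20) for _ in range(M)]

def phi(i, X):
    # Member i's fixed random feature map.
    return np.tanh(3.0 * (X @ W_in[i]))

for _ in range(epochs):
    preds = np.stack([phi(i, X) @ w[i] for i in range(M)])  # shape (M, n)
    f_bar = preds.mean(axis=0)                              # ensemble output
    for i in range(M):
        # NCL error signal: (f_i - y) - lam * (f_i - f_bar).
        # The second term pushes each member away from the ensemble mean,
        # which is what encourages diversity among the members.
        err = (preds[i] - y) - lam * (preds[i] - f_bar)
        w[i] -= lr * phi(i, X).T @ err / len(y)

ensemble_pred = np.stack([phi(i, X) @ w[i] for i in range(M)]).mean(axis=0)
mse = float(np.mean((ensemble_pred - y) ** 2))
```

With `lam = 0.0` each member minimizes only its own squared error, which corresponds to the independent local search the paper finds equally effective on the tested datasets.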
Similar resources
A Differential Evolution and Spatial Distribution based Local Search for Training Fuzzy Wavelet Neural Network
Abstract Many parameter-tuning algorithms have been proposed for training Fuzzy Wavelet Neural Networks (FWNNs). Absence of an appropriate structure, convergence to local optima, and low speed are deficiencies of FWNN learning algorithms in previous studies. In this paper, a Memetic Algorithm (MA) is introduced to train FWNNs to address the aforementioned learning deficiencies. Differential Evolution...
Multi-objective Differential Evolution for the Flow shop Scheduling Problem with a Modified Learning Effect
This paper proposes an effective multi-objective differential evolution algorithm (MDES) to solve a permutation flow shop scheduling problem (PFSSP) with modified Dejong's learning effect. The proposed algorithm combines the basic differential evolution (DE) with local search and borrows the selection operator from NSGA-II to improve the general performance. First the problem is encoded with a...
Training Radial Basis Function Neural Network using Stochastic Fractal Search Algorithm to Classify Sonar Dataset
Radial Basis Function Neural Networks (RBF NNs) are among the most widely applied NNs in the classification of real targets. Despite the use of recursive methods and gradient descent for training RBF NNs, poor classification accuracy, getting trapped in local minima, and low convergence speed are defects of this type of network. In order to overcome these defects, heuristic and meta-heuristic algorith...
Training an MLP Neural Network for Image Compression Using the GSA Method
Image compression is one of the important research fields in image processing. Up to now, different methods are presented for image compression. Neural network is one of these methods that has represented its good performance in many applications. The usual method in training of neural networks is error back propagation method that its drawbacks are late convergence and stopping in points of lo...
Ensemble strategies to build neural network to facilitate decision making
There are three major strategies to form neural network ensembles. The simplest one is the Cross Validation strategy in which all members are trained with the same training data. Bagging and boosting strategies produce perturbed samples from training data. This paper provides an ideal model based on two important factors: activation function and number of neurons in the hidden layer and based u...
Publication year: 2006